7.5 Noise


The concept of equivocation enables one to write the actual rate of information transmission $\mathcal{R}$ over a noisy channel in a rather transparent way:

\[
\mathcal{R} = I(x) - E \,; \tag{7.12}
\]

that is, the rate equals the rate of transmission of the original signal minus the uncertainty in what was sent when the message received is known. From our definition (7.11),

\[
\mathcal{R} = I(y) - I_x(y) \,, \tag{7.13}
\]

where $I_x(y)$ is the spurious part of the information received (i.e., the part due to noise) or, equivalently, the average uncertainty in a message received when the signal sent is known. It follows (cf. Sect. 8.1) that

\[
\mathcal{R} = I(x) + I(y) - I(x, y) \,, \tag{7.14}
\]

where $I(x, y)$ is the joint entropy of input (information transmitted) and output (information received). By symmetry, the joint entropy equals

\[
I(x, y) = I(x) + I_x(y) = I(y) + I_y(x) \,. \tag{7.15}
\]

We could just as well write $E$ as $I_y(x)$: it is the uncertainty in what was sent when it is known what was received. If there is no noise, $I(y) = I(x)$ and $E = 0$.
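
To make the bookkeeping concrete, the following minimal sketch (not part of the original text; the joint distribution `p_xy` is a hypothetical binary channel chosen for illustration) computes these quantities and checks that Eqs. (7.12)-(7.15) all give the same rate:

```python
# Sketch: verify Eqs. (7.12)-(7.15) for a hypothetical joint distribution
# p(i, j) over transmitted symbols i (rows) and received symbols j (columns).
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])      # hypothetical joint probabilities

p_x = p_xy.sum(axis=1)               # marginal of the input
p_y = p_xy.sum(axis=0)               # marginal of the output

I_x  = entropy(p_x)                  # I(x)
I_y  = entropy(p_y)                  # I(y)
I_xy = entropy(p_xy.flatten())       # joint entropy I(x, y)

E     = I_xy - I_y                   # equivocation I_y(x), from Eq. (7.15)
noise = I_xy - I_x                   # spurious part I_x(y)

print(I_x - E)                       # Eq. (7.12): R = I(x) - E
print(I_y - noise)                   # Eq. (7.13): R = I(y) - I_x(y)
print(I_x + I_y - I_xy)              # Eq. (7.14): R = I(x) + I(y) - I(x, y)
```

All three expressions print the same value, as the equivalence of (7.12)-(7.14) requires.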

Let the error rate be $\eta$ per symbol. Then

\[
E = I_y(x) = -\left[\eta \log \eta + (1 - \eta) \log(1 - \eta)\right] . \tag{7.16}
\]

The maximum error rate is 0.5 for a binary transmission; the equivocation is then 1

bit/symbol and the rate of information transmission is zero.
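
As a quick numerical check (a sketch assuming a binary symmetric channel with equiprobable input symbols, so that $I(x) = 1$ bit/symbol), Eq. (7.16) and the resulting rate can be tabulated:

```python
# Sketch: equivocation E of Eq. (7.16) and rate R = I(x) - E = 1 - E
# for a binary symmetric channel with error rate eta per symbol.
import math

def equivocation(eta):
    """E = -[eta*log2(eta) + (1 - eta)*log2(1 - eta)], bits per symbol."""
    if eta in (0.0, 1.0):
        return 0.0
    return -(eta * math.log2(eta) + (1 - eta) * math.log2(1 - eta))

for eta in (0.0, 0.1, 0.25, 0.5):
    E = equivocation(eta)
    print(f"eta = {eta:4.2f}   E = {E:5.3f} bit/symbol   R = {1 - E:5.3f} bit/symbol")
# At eta = 0.5 the equivocation reaches 1 bit/symbol and the rate is zero.
```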

The equivocation is just the conditional or relative entropy and can also be derived using conditional probabilities. Let $p(i)$ be the probability of the $i$th symbol being transmitted and let $p(j)$ be the probability of the $j$th symbol being received. $p(j|i)$ is the conditional probability of the $j$th signal being received when the $i$th was transmitted, $p(i|j)$ is the conditional probability of the $i$th signal being transmitted when the $j$th was received (posterior probability), and $p(i, j)$ is the joint probability of the $i$th signal being transmitted and the $j$th received.

The ignorance removed by the arrival of one symbol is (cf. Eq. 6.7)

\[
\begin{aligned}
I &= \text{initial uncertainty} - \text{final uncertainty} \\
  &= -\log p(i) - \bigl(-\log p(i|j)\bigr) \\
  &= \log \frac{p(i|j)}{p(i)} \,.
\end{aligned} \tag{7.17}
\]
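
Averaging this per-symbol gain over the joint distribution $p(i, j)$ recovers the rate $\mathcal{R}$. A brief sketch (same hypothetical channel as above) makes this explicit:

```python
# Sketch: Eq. (7.17) per symbol pair, log2 p(i|j)/p(i), averaged over p(i, j),
# gives the rate R for the hypothetical channel used earlier.
import numpy as np

p_xy = np.array([[0.45, 0.05],       # hypothetical joint probabilities p(i, j)
                 [0.05, 0.45]])
p_i = p_xy.sum(axis=1)               # p(i), marginal of the transmitted symbol
p_j = p_xy.sum(axis=0)               # p(j), marginal of the received symbol
p_i_given_j = p_xy / p_j             # posterior p(i|j), columns normalised

gain = np.log2(p_i_given_j / p_i[:, None])   # per-pair information, Eq. (7.17)
R = np.sum(p_xy * gain)                      # average over p(i, j)
print(R)                                     # equals I(x) - I_y(x) for this channel
```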